Neural Disambiguation of Causal Lexical Markers Based on Context
Authors
Abstract
Causation is a psychological tool that humans use to understand the world, and it is projected in natural language. Causation relates two events, so understanding the causal relation between those events, and human causal reasoning more broadly, requires the study of causality classification. We claim that the use of linguistic features may restrict the representation of causality, and that dense vector spaces can provide a better encoding of the causal meaning of an utterance. Herein, we propose a neural network architecture fed only with word embeddings for the task of causality classification. Our results show that our claim holds, and we outperform the state of the art on the AltLex corpus. The source code of our experiments is publicly available.
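The abstract describes a classifier fed only with word embeddings, with no hand-crafted linguistic features. The following is a minimal illustrative sketch of that setup, not the authors' architecture: a toy vocabulary, randomly initialised embeddings and weights, and a mean-of-embeddings encoder followed by one hidden layer and a sigmoid that scores whether an utterance expresses a causal relation.

```python
import numpy as np

# Hypothetical sketch of an embeddings-only causality classifier.
# Vocabulary, embeddings, and weights are toy values for illustration;
# in practice the embeddings would be pretrained and the weights learned.

rng = np.random.default_rng(0)
EMB_DIM = 8
vocab = ["the", "storm", "caused", "delays", "because", "of", "rain"]
embeddings = {w: rng.normal(size=EMB_DIM) for w in vocab}

def encode(sentence):
    """Represent an utterance as the mean of its word embeddings."""
    vecs = [embeddings[w] for w in sentence.split() if w in embeddings]
    return np.mean(vecs, axis=0) if vecs else np.zeros(EMB_DIM)

# One hidden layer followed by a sigmoid: causal (1) vs. non-causal (0).
W1 = rng.normal(size=(EMB_DIM, 4))
b1 = np.zeros(4)
w2 = rng.normal(size=4)
b2 = 0.0

def predict_causal(sentence):
    h = np.tanh(encode(sentence) @ W1 + b1)
    # Sigmoid output: probability the utterance expresses causation.
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))

p = predict_causal("the storm caused delays")
print(p)
```

The point of the sketch is the input side: the only features are dense word vectors, so nothing in the pipeline depends on causal lexical markers being enumerated by hand.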
Similar resources
Lexical Disambiguation in LTAG Using Left Context
In this paper, we present an automaton-based lexical disambiguation process for Lexicalized Tree-Adjoining Grammar (LTAG). This process builds on previous work of Bonfante et al. (2004), and extends it by computing a polarity-based abstraction, which contains information about left context. This extension allows for a faster lexical disambiguation by reducing the filtering automaton.
Semi-Automatic Data Annotation, POS Tagging and Mildly Context-Sensitive Disambiguation: the eXtended Revised AraMorph (XRAM)
An extended, revised form of Tim Buckwalter’s Arabic lexical and morphological resource AraMorph, eXtended Revised AraMorph (henceforth XRAM), is presented which addresses a number of weaknesses and inconsistencies of the original model by allowing a wider coverage of real-world Classical and contemporary (both formal and informal) Arabic texts. Building upon previous research, XRAM enhancement...
Lexical Disambiguation Based on Distributed Representations of Context Frequency
A model for lexical disambiguation is presented that is based on combining the frequencies of past contexts of ambiguous words. The frequencies are encoded in the word representations and de ne the words' semantics. A Simple Recurrent Network (SRN) parser combines the context frequencies one word at a time, always producing the most likely interpretation of the current sentence at its output. T...
context2vec: Learning Generic Context Embedding with Bidirectional LSTM
Context representations are central to various NLP tasks, such as word sense disambiguation, named entity recognition, coreference resolution, and many more. In this work we present a neural model for efficiently learning a generic context embedding function from large corpora, using bidirectional LSTM. With a very simple application of our context representations, we manage to surpass or nearl...
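The context2vec snippet describes learning a generic embedding of the context around a target word with a bidirectional LSTM. As an illustrative sketch only (not the context2vec implementation), plain RNN cells can stand in for the LSTM: a forward pass over the left context and a backward pass over the right context, concatenated into one context vector.

```python
import numpy as np

# Illustrative bidirectional context embedding. All names, weights, and
# the use of simple tanh RNN cells in place of LSTMs are assumptions
# made for this sketch.

rng = np.random.default_rng(1)
DIM = 6
words = ["she", "opened", "the", "bank", "account"]
emb = {w: rng.normal(size=DIM) for w in words}

Wx = rng.normal(size=(DIM, DIM)) * 0.1
Wh = rng.normal(size=(DIM, DIM)) * 0.1

def rnn_pass(seq):
    """Run a simple recurrent pass over a word sequence, returning states."""
    h = np.zeros(DIM)
    states = []
    for w in seq:
        h = np.tanh(emb[w] @ Wx + h @ Wh)
        states.append(h)
    return states

def context_embedding(sentence, target_index):
    """Embed the context around a target word: the final forward state over
    the left context concatenated with the final backward state over the
    right context. The target word itself is excluded."""
    left = rnn_pass(sentence[:target_index])
    right = rnn_pass(sentence[target_index + 1:][::-1])
    fwd = left[-1] if left else np.zeros(DIM)
    bwd = right[-1] if right else np.zeros(DIM)
    return np.concatenate([fwd, bwd])

c = context_embedding(words, words.index("bank"))
print(c.shape)  # (12,)
```

Such a vector can then be compared against candidate-sense embeddings, which is the kind of downstream use (word sense disambiguation) the snippet mentions.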
Disambiguation of Taxonomy Markers in Context: Russian Nouns
The paper presents experimental results on WSD, with focus on disambiguation of Russian nouns that refer to tangible objects and abstract notions. The body of contexts has been extracted from the Russian National Corpus (RNC). The tool used in our experiments is aimed at statistical processing and classification of noun contexts. The WSD procedure takes into account taxonomy markers of word mea...
Publication date: 2017